Multivariate Density Estimation with Deep Neural Mixture Models
Abstract
Albeit worryingly underrated in the recent literature on machine learning in general (and deep learning in particular), multivariate density estimation is a fundamental task in many applications, at least implicitly, and it is still an open issue. With a few exceptions, deep neural networks (DNNs) have seldom been applied to density estimation, mostly due to the unsupervised nature of the task and (especially) to the need for constrained training algorithms that end up realizing proper probabilistic models satisfying Kolmogorov’s axioms. Moreover, in spite of the well-known improvement in terms of modeling capabilities yielded by mixture models over plain single-density statistical estimators, no mixtures of DNN-based component densities have been investigated so far. The paper fills this gap by extending our previous work on neural mixture models (NMMs) to DNN mixtures. A maximum-likelihood (ML) algorithm for estimating Deep NMMs (DNMMs) is handed out, which numerically satisfies a combination of hard and soft constraints aimed at ensuring that the estimator is a proper probabilistic model. The class of probability density functions that can be modeled to any degree of precision via DNMMs is formally defined. A procedure for the automatic selection of the DNMM architecture, as well as of the hyperparameters of its ML training algorithm, is presented (exploiting the probabilistic nature of the DNMM). Experimental results on univariate and multivariate data are reported on, corroborating the effectiveness of the approach and its superiority over the most popular techniques.
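The abstract describes maximum-likelihood training of a mixture density under constraints that keep the estimator a valid probability model. As a minimal stand-in for the paper's method (plain Gaussian components here, not the DNN-based component densities the paper proposes), the classical EM algorithm illustrates the constrained-ML idea: the mixing weights stay on the probability simplex and the scales stay positive at every iteration, so the fitted function is a proper density throughout training.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy 1-D data drawn from two overlapping clusters.
x = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(1.5, 0.8, 200)])

K = 2
w = np.full(K, 1.0 / K)        # mixing weights: non-negative, sum to 1
mu = np.array([-1.0, 1.0])     # component means (rough initialization)
s = np.ones(K)                 # component standard deviations, kept positive

def comp_pdf(x, mu, s):
    """Per-component Gaussian densities, shape (n_points, K)."""
    return np.exp(-0.5 * ((x[:, None] - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def log_lik(x, w, mu, s):
    return np.log(comp_pdf(x, mu, s) @ w).sum()

ll_trace = []
for _ in range(50):
    # E-step: posterior responsibility of each component for each point.
    r = comp_pdf(x, mu, s) * w
    r /= r.sum(axis=1, keepdims=True)
    # M-step: closed-form ML updates; the weight update is a normalized
    # average, so the simplex constraint holds by construction.
    n_k = r.sum(axis=0)
    w = n_k / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n_k
    s = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n_k)
    ll_trace.append(log_lik(x, w, mu, s))
```

In the DNMM setting the closed-form M-step is unavailable, which is why the paper resorts to gradient-based ML training with explicit hard and soft constraints; the invariants enforced here implicitly (non-negativity, unit integral) are the same ones those constraints target.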
Similar Resources
Repairing Faulty Mixture Models using Density Estimation
Previous work in mixture model clustering has focused primarily on the issue of model selection. Model scoring functions (including penalized likelihood and Bayesian approximations) can guide a search of the model parameter and structure space. Relatively little research has addressed the issue of how to move through this space. Local optimization techniques, such as expectation-maximization, ...
Default priors for density estimation with mixture models
The infinite mixture of normals model has become a popular method for density estimation problems. This paper proposes an alternative hierarchical model that leads to hyperparameters that can be interpreted as the location, scale and smoothness of the density. The priors on other parts of the model have little effect on the density estimates and can be given default choices. Automatic Bayesian ...
Density Estimation by Mixture Models with Smoothing Priors
In the statistical approach for self-organizing maps (SOMs), learning is regarded as an estimation algorithm for a Gaussian mixture model with a Gaussian smoothing prior on the centroid parameters. The values of the hyperparameters and the topological structure are selected on the basis of a statistical principle. However, since the component selection probabilities are fixed to a common value,...
Semi-Parametric Estimation for Conditional Independence Multivariate Finite Mixture Models
Abstract: The conditional independence assumption for nonparametric multivariate finite mixture models, a weaker form of the well-known conditional independence assumption for random effects models for longitudinal data, is the subject of an increasing number of theoretical and algorithmic developments in the statistical literature. After presenting a survey of this literature, including an in-...
Estimation for Conditional Independence Multivariate Finite Mixture Models
The conditional independence assumption for nonparametric multivariate finite mixture models may be considered to be a weaker form of the well-known conditional independence assumption for random effects models for longitudinal data. After summarizing important recent identifiability results, this article describes and extends an algorithm for estimation of the parameters in these models. The a...
Journal
Journal title: Neural Processing Letters
Year: 2023
ISSN: 1573-773X, 1370-4621
DOI: https://doi.org/10.1007/s11063-023-11196-2